electronic structure
SMILES-Inspired Transfer Learning for Quantum Operators in Generative Quantum Eigensolver
Yin, Zhi, Li, Xiaoran, Zhang, Shengyu, Li, Xin, Zhang, Xiaojin
Given the inherent limitations of traditional Variational Quantum Eigensolver (VQE) algorithms, the integration of deep generative models into hybrid quantum-classical frameworks, specifically the Generative Quantum Eigensolver (GQE), represents a promising and innovative approach. However, taking the Unitary Coupled Cluster with Singles and Doubles (UCCSD) ansatz widely used in quantum chemistry as an example, different molecular systems require the construction of distinct quantum operators. Since different molecules often share structural similarities, exploiting those similarities when constructing quantum operators can significantly reduce computational cost. Inspired by the SMILES representation in computational chemistry, we developed a text-based representation for UCCSD quantum operators that leverages the inherent representational similarities between different molecular systems. This framework explores text pattern similarities in quantum operators and employs text similarity metrics to establish a transfer learning framework. With a naive baseline setting, our approach demonstrates knowledge transfer between different molecular systems for ground-state energy calculations within the GQE paradigm. This discovery offers significant benefits for hybrid quantum-classical computation of molecular ground-state energies, substantially reducing computational resource requirements.
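The transfer step above hinges on scoring how textually similar two molecules' operator pools are. A minimal sketch of such a metric, using character n-gram Jaccard similarity over hypothetical string-encoded UCCSD operator pools (the operator syntax below is an illustrative placeholder, not the paper's actual encoding):

```python
# Sketch: text similarity between two string-encoded UCCSD operator pools,
# in the spirit of the SMILES-inspired representation. The operator strings
# are illustrative placeholders, not the paper's encoding.

def ngrams(text, n=3):
    """Return the set of character n-grams of a string."""
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard similarity between the n-gram sets of two strings."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga and not gb:
        return 1.0
    return len(ga & gb) / len(ga | gb)

# Toy operator pools for two molecules (placeholder syntax).
ops_h2 = "t1[0,2];t1[1,3];t2[0,1,2,3]"
ops_lih = "t1[0,2];t1[1,3];t2[0,1,2,3];t2[0,1,4,5]"

score = jaccard_similarity(ops_h2, ops_lih)
```

A high score would nominate a previously trained system as the transfer source; any edit-distance or embedding-based metric could stand in for the Jaccard score here.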
Distributed Equivariant Graph Neural Networks for Large-Scale Electronic Structure Prediction
Kaniselvan, Manasa, Maeder, Alexander, Xia, Chen Hao, Ziogas, Alexandros Nikolaos, Luisier, Mathieu
Equivariant Graph Neural Networks (eGNNs) trained on density-functional theory (DFT) data can potentially perform electronic structure prediction at unprecedented scales, enabling investigation of the electronic properties of materials with extended defects, interfaces, or exhibiting disordered phases. However, as interactions between atomic orbitals typically extend over 10+ angstroms, the graph representations required for this task tend to be densely connected, and the memory requirements to perform training and inference on these large structures can exceed the limits of modern GPUs. Here we present a distributed eGNN implementation which leverages direct GPU communication and introduce a partitioning strategy of the input graph to reduce the number of embedding exchanges between GPUs. Our implementation shows strong scaling up to 128 GPUs, and weak scaling up to 512 GPUs with 87% parallel efficiency for structures with 3,000 to 190,000 atoms on the Alps supercomputer.
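The motivation for the partitioning strategy can be seen in a toy experiment, assuming only that cut edges in the atom graph are a proxy for GPU-to-GPU embedding exchanges. The comparison below (random assignment versus contiguous spatial slabs) is illustrative, not the paper's algorithm:

```python
import random

# Sketch: why spatial partitioning reduces embedding exchanges between GPUs.
# Build a toy radius graph over random 3D points and compare the number of
# cut edges (communication proxy) for a random partition versus contiguous
# spatial slabs along one axis.

random.seed(0)
points = [(random.random(), random.random(), random.random()) for _ in range(200)]
cutoff2 = 0.25 ** 2  # squared interaction cutoff

edges = [(i, j) for i in range(len(points)) for j in range(i + 1, len(points))
         if sum((a - b) ** 2 for a, b in zip(points[i], points[j])) < cutoff2]

def cut_edges(assignment):
    """Count edges whose endpoints live on different partitions."""
    return sum(assignment[i] != assignment[j] for i, j in edges)

n_parts = 4
random_part = [random.randrange(n_parts) for _ in points]

# Slab partition: sort atoms along x and split into contiguous blocks.
order = sorted(range(len(points)), key=lambda i: points[i][0])
slab_part = [0] * len(points)
for rank, i in enumerate(order):
    slab_part[i] = rank * n_parts // len(points)

random_cut, slab_cut = cut_edges(random_part), cut_edges(slab_part)
```

Because interactions only extend over a finite cutoff, slab partitions confine cut edges to thin boundary regions, so `slab_cut` comes out far below `random_cut`.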
Materials Learning Algorithms (MALA): Scalable Machine Learning for Electronic Structure Calculations in Large-Scale Atomistic Simulations
Cangi, Attila, Fiedler, Lenz, Brzoza, Bartosz, Shah, Karan, Callow, Timothy J., Kotik, Daniel, Schmerler, Steve, Barry, Matthew C., Goff, James M., Rohskopf, Andrew, Vogel, Dayton J., Modine, Normand, Thompson, Aidan P., Rajamanickam, Sivasankaran
We present the Materials Learning Algorithms (MALA) package, a scalable machine learning framework designed to accelerate density functional theory (DFT) calculations suitable for large-scale atomistic simulations. Using local descriptors of the atomic environment, MALA models efficiently predict key electronic observables, including local density of states, electronic density, density of states, and total energy. The package integrates data sampling, model training and scalable inference into a unified library, while ensuring compatibility with standard DFT and molecular dynamics codes. We demonstrate MALA's capabilities with examples including boron clusters, aluminum across its solid-liquid phase boundary, and predicting the electronic structure of a stacking fault in a large beryllium slab. Scaling analyses reveal MALA's computational efficiency and identify bottlenecks for future optimization. With its ability to model electronic structures at scales far beyond standard DFT, MALA is well suited for modeling complex material systems, making it a versatile tool for advanced materials research.
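Observables such as the electron count and band energy follow from a predicted density of states by energy integration. A generic sketch, with a toy Gaussian DOS standing in for a model prediction (the integrals are standard, not MALA's API):

```python
import math

# Sketch: deriving electronic observables from a predicted density of states.
# A toy Gaussian DOS stands in for the ML prediction.

def fermi(e, mu, kT=0.025):
    """Fermi-Dirac occupation at energy e (eV) for chemical potential mu."""
    return 1.0 / (1.0 + math.exp((e - mu) / kT))

energies = [-5.0 + 0.01 * i for i in range(1001)]  # eV grid
dos = [math.exp(-0.5 * e * e) / math.sqrt(2 * math.pi) for e in energies]  # toy DOS

def trapz(ys, xs):
    """Trapezoidal integration of samples ys over grid xs."""
    return sum(0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i])
               for i in range(len(xs) - 1))

mu = 0.0
n_electrons = trapz([d * fermi(e, mu) for d, e in zip(dos, energies)], energies)
band_energy = trapz([e * d * fermi(e, mu) for d, e in zip(dos, energies)], energies)
```

With this symmetric toy DOS and the chemical potential at its center, exactly half the states are occupied, and the band energy is negative because only states below the Fermi level are filled.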
Multi-task learning for molecular electronic structure approaching coupled-cluster accuracy
Tang, Hao, Xiao, Brian, He, Wenhao, Subasic, Pero, Harutyunyan, Avetik R., Wang, Yao, Liu, Fang, Xu, Haowei, Li, Ju
Machine learning (ML) plays an important role in quantum chemistry, providing fast-to-evaluate predictive models for various properties of molecules. However, most existing ML models for molecular electronic properties use density functional theory (DFT) databases as ground truth in training, so their prediction accuracy cannot surpass that of DFT. In this work, we developed a unified ML method for electronic structures of organic molecules using gold-standard CCSD(T) calculations as training data. Tested on hydrocarbon molecules, our model outperforms DFT with the widely used hybrid and double-hybrid functionals in both computational cost and prediction accuracy across various quantum chemical properties. As case studies, we apply the model to aromatic compounds and semiconducting polymers, on both ground-state and excited-state properties, demonstrating its accuracy and generalization capability to complex systems that are hard to calculate at the CCSD(T) level.
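A unified model for several quantum-chemical targets is typically trained against a weighted multi-task objective. A minimal sketch of such an objective; the task names and weights below are illustrative, not the paper's configuration:

```python
# Sketch: a weighted multi-task objective for training one model on several
# quantum-chemical targets. Task names and weights are illustrative.

def multi_task_loss(predictions, targets, weights):
    """Weighted sum of per-task mean-squared errors."""
    total = 0.0
    for task, weight in weights.items():
        errs = [(p - t) ** 2 for p, t in zip(predictions[task], targets[task])]
        total += weight * sum(errs) / len(errs)
    return total

preds = {"energy": [1.1, 2.0], "gap": [0.4, 0.6]}
labels = {"energy": [1.0, 2.2], "gap": [0.5, 0.5]}
weights = {"energy": 1.0, "gap": 0.5}
loss = multi_task_loss(preds, labels, weights)
```

Sharing one backbone across tasks lets scarce CCSD(T)-level labels for one property regularize the representation used for the others.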
Autonomous microARPES
Agustsson, Steinn Ymir, Jones, Alfred J. H., Curcio, Davide, Ulstrup, Søren, Miwa, Jill, Mottin, Davide, Karras, Panagiotis, Hofmann, Philip
Angle-resolved photoemission spectroscopy (ARPES) is a technique used to map the occupied electronic structure of solids. Recent progress in X-ray focusing optics has led to the development of ARPES into a microscopic tool, permitting the electronic structure to be spatially mapped across the surface of a sample. This comes at the expense of a time-consuming scanning process to cover not only a three-dimensional energy-momentum ($E, k_z, k_y$) space but also the two-dimensional surface area. Here, we implement a protocol to autonomously search both $\mathbf{k}$- and real space in order to find positions of particular interest, either because of their high photoemission intensity or because of sharp spectral features. The search is based on the use of Gaussian process regression and can easily be expanded to include additional parameters or optimisation criteria. This autonomous experimental control is implemented on the SGM4 micro-focus beamline of the synchrotron radiation source ASTRID2.
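The acquisition loop behind such an autonomous scan can be sketched with a plain Gaussian process: fit to the intensities measured so far, then probe where the posterior variance is largest. This is a generic GP sketch under an RBF kernel, not the SGM4 control code:

```python
import numpy as np

# Sketch: Gaussian-process-driven autonomous scanning. Fit a GP (RBF kernel)
# to measured intensities, then pick the candidate position with the largest
# posterior variance as the next probe point.

def rbf(X1, X2, length=0.2):
    """RBF kernel matrix between two sets of points."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

rng = np.random.default_rng(0)
measured = rng.uniform(0, 1, size=(5, 2))                 # probed (x, y) positions
intensity = np.sin(6 * measured[:, 0]) + measured[:, 1]   # toy measured signal

candidates = rng.uniform(0, 1, size=(200, 2))             # positions not yet probed
K = rbf(measured, measured) + 1e-6 * np.eye(len(measured))
Ks = rbf(candidates, measured)

# Posterior variance at each candidate (prior variance 1 for the RBF kernel).
var = 1.0 - np.einsum("ij,jk,ik->i", Ks, np.linalg.inv(K), Ks)
next_point = candidates[np.argmax(var)]
```

Swapping the variance for an intensity- or sharpness-weighted acquisition function changes the search criterion without touching the loop, which is what makes the protocol easy to extend to additional parameters.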
Self-consistent Validation for Machine Learning Electronic Structure
Hu, Gengyuan, Wei, Gengchen, Lou, Zekun, Torr, Philip H. S., Ouyang, Wanli, Zhong, Han-sen, Lin, Chen
Shanghai Artificial Intelligence Laboratory and Department of Engineering, University of Oxford (Dated: February 16, 2024). Machine learning has emerged as a significant approach to efficiently tackling electronic structure problems. Despite its potential, there is little guarantee that such models generalize to unseen data, which hinders their application in real-world scenarios. To address this issue, a technique has been proposed to estimate the accuracy of the predictions. This method integrates machine learning with self-consistent field methods to achieve both low validation cost and interpretability. This, in turn, enables exploration of the model's capabilities with active learning and instills confidence in its integration into real-world studies.
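The core idea, that a converged electronic structure is a fixed point of the self-consistent field (SCF) map, suggests a cheap validation signal: push the model's prediction through one SCF iteration and measure how much it moves. A toy sketch, with a placeholder linear map standing in for a real SCF step:

```python
# Sketch: self-consistent validation. A converged solution is a fixed point
# of the SCF map F, so the change after one iteration of F is a reference-free
# error estimate. The map below is a toy stand-in for a real SCF step.

def scf_map(density):
    """Toy SCF iteration (placeholder). Its fixed point is d = 0.2 everywhere."""
    return [0.5 * d + 0.1 for d in density]

def self_consistency_residual(prediction):
    """Max change of the prediction under one SCF iteration."""
    updated = scf_map(prediction)
    return max(abs(u - p) for u, p in zip(updated, prediction))

good_pred = [0.2, 0.2, 0.2]   # sits at the fixed point
bad_pred = [0.9, 0.1, 0.5]

residual_good = self_consistency_residual(good_pred)
residual_bad = self_consistency_residual(bad_pred)
```

A small residual does not prove the prediction is exact, but a large residual flags inputs worth routing back into active learning, which is how the validation cost stays at a single SCF step per sample.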
Universal Machine Learning Kohn-Sham Hamiltonian for Materials
Zhong, Yang, Yang, Jihui, Xiang, Hongjun, Gong, Xingao
While density functional theory (DFT) serves as a prevalent computational approach in electronic structure calculations, its computational demands and scalability limitations persist. Recently, leveraging neural networks to parameterize the Kohn-Sham DFT Hamiltonian has emerged as a promising avenue for accelerating electronic structure computations. Despite advancements, challenges remain, such as the need to compute extensive DFT training data when exploring new systems and the difficulty of establishing accurate ML models for multi-elemental materials. Addressing these hurdles, this study introduces a universal electronic Hamiltonian model trained on Hamiltonian matrices obtained from first-principles DFT calculations of nearly all crystal structures on the Materials Project. We demonstrate its generality in predicting electronic structures across the whole periodic table, including complex multi-elemental systems. By offering a reliable, efficient framework for computing electronic properties, this universal Hamiltonian model lays the groundwork for advancements in diverse fields related to electronic structures.
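The payoff of predicting the Hamiltonian directly is that electronic structure then follows from a single (generalized) eigenvalue problem rather than a fresh SCF cycle. A sketch with random symmetric stand-ins for the predicted Hamiltonian H and overlap S:

```python
import numpy as np

# Sketch: from a predicted Kohn-Sham Hamiltonian H (and overlap S for a
# non-orthogonal basis) to orbital energies via a generalized eigenvalue
# problem H c = e S c. H and S are random stand-ins for model output.

rng = np.random.default_rng(1)
n = 6
H = rng.standard_normal((n, n)); H = 0.5 * (H + H.T)           # "predicted" Hamiltonian
A = rng.standard_normal((n, n)); S = A @ A.T + n * np.eye(n)   # positive-definite overlap

# Loewdin orthogonalization: transform with S^{-1/2}, then diagonalize.
w, V = np.linalg.eigh(S)
S_inv_half = V @ np.diag(w ** -0.5) @ V.T
eigvals = np.linalg.eigvalsh(S_inv_half @ H @ S_inv_half)      # orbital energies
```

For a crystal, the same diagonalization would be repeated per k-point on the Bloch-transformed Hamiltonian, which is why bypassing the SCF loop dominates the speedup.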
Machine Learning 1- and 2-electron reduced density matrices of polymeric molecules
Pekker, David, Liang, Chungwen, Pattanayak, Sankha, Mukhopadhyay, Swagatam
Creyon Bio, 3210 Merryfield Row, San Diego, CA 92121. Encoding the electronic structure of molecules using 2-electron reduced density matrices (2RDMs), as opposed to many-body wave functions, has been a decades-long quest, as the 2RDM contains sufficient information to compute the exact molecular energy but requires only polynomial storage. We focus on linear polymers with varying conformations and numbers of monomers and show that we can use machine learning to predict both the 1-electron and the 2-electron reduced density matrices. Moreover, by applying the Hamiltonian operator to the predicted reduced density matrices, we show that we can recover the molecular energy. Thus, we demonstrate the feasibility of a machine learning approach to predicting electronic structure that generalizes both to new conformations and to new molecules. At the same time, our work circumvents the N-representability problem that has stymied the adoption of 2RDM methods, by directly machine-learning valid reduced density matrices. Specifically, we show that all desired 1- and 2-electron correlations can be predicted at any level of theory.
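Recovering the energy from predicted RDMs amounts to contracting the one- and two-electron Hamiltonian integrals with the 1- and 2-RDMs. A sketch of that contraction, with random placeholder tensors in place of chemistry data:

```python
import numpy as np

# Sketch: electronic energy from predicted reduced density matrices,
#   E = sum_ij h_ij gamma1_ij + 1/2 sum_ijkl g_ijkl gamma2_ijkl.
# The integrals and RDMs below are random placeholders, not chemistry data.

rng = np.random.default_rng(2)
n = 4
h = rng.standard_normal((n, n)); h = 0.5 * (h + h.T)   # one-electron integrals
g = rng.standard_normal((n, n, n, n))                  # two-electron integrals (toy)
gamma1 = rng.standard_normal((n, n)); gamma1 = 0.5 * (gamma1 + gamma1.T)  # predicted 1-RDM
gamma2 = rng.standard_normal((n, n, n, n))                                # predicted 2-RDM

energy = (np.einsum("ij,ij->", h, gamma1)
          + 0.5 * np.einsum("ijkl,ijkl->", g, gamma2))
```

Because the energy is linear in the RDMs, any error in the predicted matrices maps linearly onto the energy, which is what makes RDM prediction a natural ML target compared with wave-function amplitudes.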
New connections between quantum computing and machine learning in computational chemistry
Quantum computing promises to improve our ability to perform some critical computational tasks in the future. Machine learning is changing the way we use computers in our present everyday life and in science. It is natural to seek connections between these two emerging approaches to computing, in the hope of reaping multiple benefits. The search for connecting links has just started, but we are already seeing a lot of potential in this wild, unexplored territory. We present here two new research articles: "Precise measurement of quantum observables with neural-network estimators," published in Physical Review Research, and "Fermionic neural-network states for ab-initio electronic structure," published in Nature Communications.
DeepMind Strikes Back, Now Tackling Quantum Mechanical Calculations
Alphabet's DeepMind's success in biological chemistry with AlphaFold is by now well established, as I presented and discussed in previous stories, paving the way for a new era of biological chemistry, biophysics, and well… biology in general. Now the AI company has struck back, this time tackling so-called "quantum mechanical (QM) calculations", which deal with the highest possible "resolution" at which chemistry can be studied: electron densities, distributions, and spin states, the key elements modulating reactivity and bulk properties. It turns out that the structure and reactivity of molecules (small, big, or "infinite" as in blocks of materials; organic, inorganic, or mixed) are determined by their electronic structures, which of course depend on 3D structures but are not just about how atoms are distributed; they are more about how electrons are distributed. To be precise, it is not even just about how electrons distribute, but also about their spin states. These "electronic structures" of molecules can globally be described by the Schrödinger equation, through so-called "electron wavefunctions", which are essentially mathematical descriptions of the probability of finding an electron in a particular location in space.